Snap to launch lightweight, immersive Specs in 2026
Snap Inc. announced at Augmented World Expo 2025 that it will launch lightweight, immersive Specs in 2026. Specs are a wearable computer integrated into a lightweight pair of glasses, with see-through lenses that enhance the physical world with digital experiences.
Snap’s all-new Specs are uniquely positioned to understand the world through advanced machine learning, bring AI assistance into three-dimensional space, enable shared games and experiences with friends, and provide a flexible and powerful workstation for browsing, streaming, and more, the company claims.
“We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world, and we can’t wait to publicly launch our new Specs next year,” said Evan Spiegel, co-founder and CEO of Snap Inc. “We couldn’t be more excited about the extraordinary progress in artificial intelligence and augmented reality that is enabling new, human-centered computing experiences. We believe Specs are the most advanced personal computer in the world, and we can’t wait for you to see for yourself.”

People use AR Lenses in the Snapchat camera 8 billion times per day, and over 400,000 developers have built more than 4 million Lenses with Snap’s world-leading AR tools. Snap released its fifth generation of Spectacles for developers in 2024, paving the way for the public launch of Specs in 2026.
The company also announced major updates to Snap OS, building on feedback and suggestions from its developer community:
- Deep Integrations with OpenAI and Gemini on Google Cloud: Snap now enables developers to build multimodal AI-powered Lenses and publish them for the Spectacles community. For example, developers are using AI to provide text translation and currency conversion (Super Travel), suggest recipes (Cookmate), and lead users on whimsical adventures (Wisp World) based on what they see, say, or hear while wearing Spectacles. Snap says it offers camera access designed with privacy in mind through its proprietary Remote Service Gateway.
- Depth Module API: Translates 2D information from large language models to anchor AR information accurately in three dimensions, unlocking a new paradigm for spatial intelligence.
- Automated Speech Recognition API: Enables real-time transcription in more than 40 languages, including non-native accents, with high accuracy (see the illustrative sketch after this list).
- Snap3D API: Lets developers generate 3D objects on the fly inside Lenses.
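The announcement names these APIs but does not publish their interfaces. As a rough illustration of how a Lens might combine speech recognition, an AI model, and depth-based anchoring, here is a minimal TypeScript sketch; every type and function name in it (SpeechService, AssistantService, DepthService, suggestAndPin, and so on) is a hypothetical stand-in, not Snap's actual Snap OS or Lens Studio API.

```typescript
// Hypothetical sketch only: these interfaces are illustrative stand-ins, not
// Snap's published APIs, whose names and signatures do not appear in the announcement.

// A transcription result, as a speech-recognition service like the one described might return.
interface TranscriptionResult {
  text: string;
  languageCode: string; // e.g. "en-US"
  confidence: number;   // 0..1
}

// A 3D anchor produced by projecting a 2D point into world space,
// the kind of mapping the Depth Module API is described as providing.
interface WorldAnchor {
  x: number;
  y: number;
  z: number;
}

// Stand-in services a Lens might compose: speech-to-text, an AI assistant, and depth lookup.
interface SpeechService {
  transcribeOnce(): Promise<TranscriptionResult>;
}
interface AssistantService {
  ask(prompt: string): Promise<string>;
}
interface DepthService {
  anchorFromScreenPoint(u: number, v: number): WorldAnchor;
}

// Example flow: listen to the wearer, ask an AI model for a short suggestion,
// then pin the answer in 3D near the screen point the wearer is looking at.
async function suggestAndPin(
  speech: SpeechService,
  assistant: AssistantService,
  depth: DepthService,
  gazePoint: { u: number; v: number }
): Promise<{ label: string; anchor: WorldAnchor }> {
  const heard = await speech.transcribeOnce();
  const reply = await assistant.ask(
    `The wearer said: "${heard.text}". Reply with one short suggestion.`
  );
  const anchor = depth.anchorFromScreenPoint(gazePoint.u, gazePoint.v);
  return { label: reply, anchor };
}
```

The point of the sketch is the composition pattern the announcement implies: speech and camera input feed a multimodal model, and a depth service turns the model's flat output into content anchored in three-dimensional space.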
Snap is also launching new tools specifically for developers building location-based experiences, making it easier to bring monuments, museums, and more to life:
- Fleet Management app: Enables developers to remotely monitor and manage multiple pairs of Specs.
- Guided Mode: Developers can configure Specs to launch directly into a single-player or multiplayer Lens for a seamless visitor experience.
- Guided Navigation: This feature makes it easy to build AR-guided tours that direct people through a series of landmarks at events or museums.
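Guided Navigation is described only at a high level, so the following is a minimal sketch of how a location-based tour might be modeled and sequenced; the Landmark and TourStop types, the nextStop helper, and the sample coordinates are illustrative assumptions rather than Snap's published schema.

```typescript
// Hypothetical sketch only: a minimal data model for an AR-guided tour of the
// kind Guided Navigation is described as supporting.

interface Landmark {
  id: string;
  title: string;
  latitude: number;
  longitude: number;
}

interface TourStop {
  landmark: Landmark;
  narration: string;           // text shown or spoken when the visitor arrives
  arrivalRadiusMeters: number; // how close the visitor must be to trigger the stop
}

// Returns the next stop the visitor has not yet completed, or null when the tour is done.
function nextStop(stops: TourStop[], completedIds: Set<string>): TourStop | null {
  for (const stop of stops) {
    if (!completedIds.has(stop.landmark.id)) {
      return stop;
    }
  }
  return null;
}

// Example: a two-stop museum tour (coordinates are made up for illustration).
const tour: TourStop[] = [
  {
    landmark: { id: "hall-1", title: "Entrance Hall", latitude: 51.5194, longitude: -0.1270 },
    narration: "Welcome! Walk toward the main staircase.",
    arrivalRadiusMeters: 5,
  },
  {
    landmark: { id: "gallery-2", title: "Sculpture Gallery", latitude: 51.5196, longitude: -0.1268 },
    narration: "This gallery holds the marble collection.",
    arrivalRadiusMeters: 5,
  },
];

const upcoming = nextStop(tour, new Set(["hall-1"])); // -> the Sculpture Gallery stop
```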